52 research outputs found

    XQuery Streaming by Forest Transducers

    Full text link
    Streaming of XML transformations is a challenging task, and only very few systems support it. Research approaches generally define custom fragments of XQuery and XPath that are amenable to streaming, and then design custom algorithms for each fragment. These languages have several shortcomings. Here we take a more principled approach to the problem of streaming XQuery-based transformations. We start with an elegant transducer model for which many static analysis problems are well understood: the Macro Forest Transducer (MFT). We show that a large fragment of XQuery can be translated into MFTs; indeed, our fragment can express important features that are missing from other XQuery stream engines such as GCX: it supports XPath predicates and let-statements. We then rely on a streaming execution engine for MFTs, one which uses a well-founded set of optimizations from functional programming, such as strictness analysis and deforestation. Our prototype achieves time and memory efficiency comparable to the fastest known engine for XQuery streaming, GCX. This is surprising because our engine relies on the OCaml built-in garbage collector and does not use any specialized buffer management, while GCX's efficiency is due to clever and explicit buffer management. Comment: Full version of the paper in the Proceedings of the 30th IEEE International Conference on Data Engineering (ICDE 2014)
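
    No code accompanies the abstract; as a rough, hypothetical illustration of why rewriting via transducer rules permits streaming (a toy in Python, not the authors' OCaml MFT engine), the sketch below rewrites a stream of open/text/close events rule by rule and emits output immediately, without buffering the whole document.

        # Toy sketch only, not the paper's MFT engine: a tag-rewriting
        # transducer driven by SAX-like events. Because each rule needs only
        # the current event, output is produced lazily as input streams in.
        RULES = {"list": "ul", "item": "li"}      # hypothetical rewrite rules

        def transduce(events):
            """Consume (kind, value) events and yield rewritten events lazily."""
            for kind, value in events:
                if kind in ("open", "close"):
                    yield kind, RULES.get(value, value)   # rewrite tag if a rule applies
                else:
                    yield kind, value                     # text nodes pass through

        def serialize(events):
            parts = []
            for kind, value in events:
                if kind == "open":
                    parts.append("<%s>" % value)
                elif kind == "close":
                    parts.append("</%s>" % value)
                else:
                    parts.append(value)
            return "".join(parts)

        stream = [("open", "list"), ("open", "item"), ("text", "a"),
                  ("close", "item"), ("close", "list")]
        print(serialize(transduce(stream)))   # -> <ul><li>a</li></ul>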

    FILM COATING ONTO COHESIVE FINE PARTICLES BY A NOVEL ROTATING FLUIDIZED BED COATER

    Get PDF
    In this study, film coating onto cohesive fine particles was conducted using a novel rotating fluidized bed coater (RFBC). In order to avoid the formation of agglomerates, baffle plates were installed inside the RFBC. Coating experiments were conducted under various operating conditions, and the coated particles were evaluated based on their physical properties. As a result, coated particles with an extremely small degree of agglomeration and a favorable prolonged-release property for a tracer material could be obtained

    Synthesis of Nitrogen-Doped Carbon Nanocoils with Adjustable Morphology using Ni–Fe Layered Double Hydroxides as Catalyst Precursors

    Get PDF
    Nitrogen-doped carbon nanocoils (CNCs) with adjusted morphologies were synthesized in a one-step catalytic chemical vapour deposition (CVD) process using acetonitrile as the carbon and nitrogen source. The nickel iron oxide/nickel oxide nanocomposites, which were derived from nickel–iron layered double hydroxide (LDH) precursors, were employed as catalysts for the synthesis of CNCs. In this method, precursor-to-catalyst transformation, catalyst activation, formation of CNCs, and nitrogen doping were all performed in situ in a single process. The morphology (coil diameter, coil pitch, and fibre diameter) and nitrogen content of the synthesized CNCs were individually adjusted by modulation of the catalyst composition and CVD reaction temperature, respectively. The adjustable ranges of the coil diameter, coil pitch, fibre diameter, and nitrogen content were confirmed to be approximately 500±100 nm, 600±100 nm, 100±20 nm, and 1.1±0.3 atom%, respectively

    Surfactant-free solution synthesis of fluorescent platinum subnanoclusters

    Get PDF
    We have demonstrated the first surfactant-free synthesis of fluorescent Pt nanoclusters in N,N-dimethylformamide (DMF) solution. The Pt nanoclusters consist of 4 to 6 Pt atoms. They form highly stable dispersions in water, under both acidic (pH 2) and basic conditions (pH 12), and at ionic strengths of 1 M NaCl

    A comprehensive validation of very early rule-out strategies for non-ST-segment elevation myocardial infarction in emergency departments:protocol for a multicentre prospective cohort study

    Get PDF
    Introduction: Recent advances in troponin sensitivity have enabled early and accurate rule-out of myocardial infarction, especially non-ST elevation myocardial infarction (NSTEMI), in emergency departments (EDs), with the development of various prediction rules and high-sensitivity troponin (hs-troponin)-based strategies. Reliance on clinical impression, however, is still common, and it remains unknown which of these strategies is superior. Therefore, our objective in this prospective cohort study is to comprehensively validate the diagnostic accuracy of the clinical impression-based strategy, prediction rules and hs-troponin-based strategies for ruling out NSTEMI. Methods and analysis: In total, 1500 consecutive adult patients with symptoms suggestive of acute coronary syndrome will be prospectively recruited from five EDs in two tertiary-level hospitals, two secondary-level community hospitals and one university hospital in Japan. The study began in July 2018, and the recruitment period will be about 1 year. A board-certified emergency physician will complete standardised case report forms and independently perform a clinical impression-based risk estimation of NSTEMI. Index strategies to be compared will include the clinical impression-based strategy, prediction rules and hs-troponin-based strategies for the following troponin assays (Roche Elecsys hs-troponin T; Abbott ARCHITECT hs-troponin I; Siemens ADVIA Centaur hs-troponin I; Siemens ADVIA Centaur sensitive-troponin I). The reference standard will be the composite of type 1 MI and cardiac death within 30 days after admission to the ED. Outcome measures will be negative predictive value, sensitivity and effectiveness, defined as the proportion of patients categorised as low risk for NSTEMI. We will also evaluate the inter-rater reliability of the clinical impression-based risk estimation. Ethics and dissemination: The study is approved by the Ethics Committees of the Kyoto University Graduate School and Faculty of Medicine and of the five hospitals where we will recruit patients. We will disseminate the study results through conference presentations and peer-reviewed journals
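
    The outcome measures named in the protocol (negative predictive value, sensitivity, and effectiveness as the proportion of patients ruled out) are standard quantities over a 2x2 table against the 30-day reference standard; the Python sketch below, with purely hypothetical counts, is only meant to make those definitions concrete and is not part of the protocol.

        # Illustrative sketch, not from the protocol: outcome measures for a
        # rule-out strategy, computed from counts against the 30-day composite
        # reference standard (tp/fn: events classified high/low risk;
        # fp/tn: non-events classified high/low risk).
        def rule_out_metrics(tp, fp, fn, tn):
            total = tp + fp + fn + tn
            npv = tn / (tn + fn)                  # negative predictive value
            sensitivity = tp / (tp + fn)
            effectiveness = (tn + fn) / total     # proportion categorised as low risk
            return npv, sensitivity, effectiveness

        # Hypothetical counts for illustration only:
        print(rule_out_metrics(tp=95, fp=400, fn=5, tn=1000))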

    Building an Educational Exchange Network through High School-University Collaboration: Activities of the Communication Education Study Group and "Communication Literacy"

    Get PDF
    A credit-recognized course, "Communication Literacy", was offered to students from partner high schools whose admission to the college had already been decided. The content and schedule of the course were determined through repeated surveys, observation visits and discussions in the "Communication Education Study Group", which has met roughly once a month since the 2007 academic year and consists of teachers from the partner high schools and the college's faculty and staff. Within this study group, brainstorming sessions were also used to identify the "competency elements required of working members of society", based on the Ministry of Economy, Trade and Industry's "Fundamental Competencies for Working Persons", and a questionnaire survey on the necessity and attainment of these elements was conducted at the partner high schools and at high schools nationwide. The results showed that the basic competencies considered most necessary for working adults are "communication skills" and "general common sense", and that "problem-finding, planning and proposal skills" and "expressive and presentation skills" are perceived as insufficient. The content of "Communication Literacy" was designed on the basis of this understanding. This year 110 students took "Communication Literacy". Although verifying its effect is not easy, we also report the results of questionnaire surveys conducted with the students. A novel coordination program, namely "Communication Literacy", consisting of 15 lesson classes, was developed and held for secondary school third-grade students. The class was developed and implemented in a project set up with members from secondary school teachers and Shohoku College teachers and staff. A survey on the basic vocational abilities necessary for graduates was conducted by sending questionnaires to secondary school teachers. Questionnaires were also given to the students who participated in the program to clarify its effect

    An Attempt to Develop a Transition Education Program through High School-University Collaboration

    Get PDF
    To develop an efficient transition education program through high school-university collaboration, a questionnaire survey targeting high schools was conducted. The questions asked about the necessity and attainment of ten elements of basic career competencies needed on entering employment, derived from the results of a survey the college had conducted in the past, as well as the twelve elements of the Ministry of Economy, Trade and Industry's "Fundamental Competencies for Working Persons". The results showed that the basic career competency most needed by working adults is "communication skills", and that basic PC operation skills are already sufficiently acquired through the high school subject "Information". Based on these results, a Communication Education Study Group of high school and college teachers was held as part of the high school-university collaboration program, and a twelve-lesson transition education program, "Communication Literacy", was launched. Enquiries were made to upper secondary school teachers on the basic vocational abilities necessary for graduates. Since "information literacy" was recently introduced as a compulsory subject in secondary schools, its relation to the elements necessary for an ordinary person who wishes to work in a company was investigated. Based on the results, a project was set up with members from secondary school teachers and Shohoku College teachers and library staff. As a result, a novel coordination program, namely "communication literacy", consisting of 12 lessons, was held for secondary school third grade students

    Extraction of Lexical Translations from Non-Aligned Corpora

    No full text
    A method for extracting lexical translations from non-aligned corpora is proposed to cope with the unavailability of large aligned corpora. The assumption that "translations of two co-occurring words in a source language also co-occur in the target language" is adopted and represented in a stochastic matrix formulation. The translation matrix provides the co-occurrence information translated from the source into the target. This translated co-occurrence information should resemble that of the original in the target when the ambiguity of the translational relation is resolved. An algorithm to obtain the best translation matrix is introduced. Some experiments were performed to evaluate the effectiveness of the ambiguity resolution and the refinement of the dictionary.
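
    As a hedged numerical reading of the stochastic-matrix formulation described above (an assumed interpretation, not the paper's own algorithm), a candidate translation matrix T can be scored by how closely the source co-occurrence matrix A, mapped through T, resembles the target co-occurrence matrix B; resolving translational ambiguity then amounts to searching for the T that maximises this resemblance.

        # Illustrative sketch of the described idea, not the paper's algorithm:
        # score a row-stochastic translation matrix T by comparing the translated
        # source co-occurrence T.T @ A @ T with the target co-occurrence B.
        import numpy as np

        def score(A, B, T):
            translated = T.T @ A @ T                  # source co-occurrence in target vocabulary
            return -np.linalg.norm(translated - B)    # higher (closer to 0) is better

        # Toy vocabularies: two source words, two target words.
        A = np.array([[0.0, 1.0], [1.0, 0.0]])        # source co-occurrence
        B = np.array([[0.0, 1.0], [1.0, 0.0]])        # target co-occurrence
        T_unambiguous = np.eye(2)                     # one translation per word
        T_ambiguous = np.full((2, 2), 0.5)            # fully ambiguous dictionary
        print(score(A, B, T_unambiguous), score(A, B, T_ambiguous))  # the unambiguous T scores higher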

    A library of constructive skeletons for sequential style of parallel programming

    No full text
    With the increasing popularity of parallel programming environments such as PC clusters, more and more sequential programmers, with little knowledge about parallel architectures and parallel programming, are hoping to write parallel programs. Numerous attempts have been made to develop high-level parallel programming libraries that use abstraction to hide low-level concerns and reduce the difficulties of parallel programming. Among them, libraries of parallel skeletons have emerged as a promising way in this direction. Unfortunately, these libraries are not well accepted by sequential programmers, because of incomplete elimination of lower-level details, ad hoc selection of library functions, unsatisfactory performance, or lack of convincing application examples. This paper addresses the principles of designing skeleton libraries for parallel programming and reports implementation details and practical applications of a skeleton library, SkeTo. The SkeTo library is unique in that it has a solid theoretical foundation based on the theory of Constructive Algorithmics and is practical for describing various parallel computations in a sequential manner.
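
    SkeTo itself is a C++ library, so the following Python fragment is only a hypothetical illustration of the "sequential style" the abstract describes, not SkeTo's actual API: the program is composed from map and reduce skeletons called like ordinary sequential functions, and only the skeleton bodies would need to change to distribute work in a parallel implementation.

        # Hypothetical sketch, not SkeTo's API: skeletons behind a sequential-
        # looking interface. A parallel implementation would replace the bodies
        # (e.g. distributing the list over processes), leaving user code unchanged.
        from functools import reduce

        def smap(f, xs):
            """Map skeleton: apply f to every element (parallelisable element-wise)."""
            return [f(x) for x in xs]

        def sreduce(op, xs, unit):
            """Reduce skeleton: fold with an associative op (parallelisable as a tree)."""
            return reduce(op, xs, unit)

        # Sequential-style program: sum of squares of 0..9.
        xs = list(range(10))
        print(sreduce(lambda a, b: a + b, smap(lambda x: x * x, xs), 0))   # -> 285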